We investigate two new optimization problems -- minimizing a submodular function subject to a submodular lower bound constraint (submodular cover) and maximizing a submodular function subject to a submodular upper bound constraint (submodular knapsack). We are motivated by a number of real-world applications in machine learning, including sensor placement and data subset selection, which require maximizing a certain submodular function (like coverage or diversity) while simultaneously minimizing another (like cooperative cost). These problems are often posed as minimizing the difference between submodular functions [14, 35], which is inapproximable in the worst case. We show, however, that by phrasing these problems as constrained optimization, which is more natural for many applications, we achieve a number of bounded approximation guarantees. We also show that both problems are closely related and that an approximation algorithm solving one can be used to obtain an approximation guarantee for the other. We provide hardness results for both problems, showing that our approximation factors are tight up to log factors. Finally, we empirically demonstrate the performance and good scalability properties of our algorithms.